# Low Perplexity Optimization
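
The tag above refers to perplexity, the standard measure of how well a language model predicts text: lower perplexity means the model assigns higher probability to the observed tokens. As a minimal sketch (the per-token probabilities below are made up for illustration, not output from any model listed on this page), perplexity is the exponential of the average per-token negative log-likelihood:

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(-mean log-probability per token).

    token_log_probs: natural-log probabilities a model assigned to each
    observed token (hypothetical values here, for illustration only).
    """
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# A model that assigns higher probability to each token scores lower perplexity.
confident = [math.log(0.5)] * 4   # each token predicted with p = 0.5
uncertain = [math.log(0.1)] * 4   # each token predicted with p = 0.1
assert perplexity(confident) < perplexity(uncertain)
```

A model guessing uniformly over a vocabulary of size V has perplexity V, so "low perplexity optimization" amounts to driving this quantity well below the uniform baseline on the target domain.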

## CardioBERTa.nl Clinical
License: GPL-3.0
A Dutch medical-domain pretrained language model developed by Amsterdam University Medical Center, specializing in medical text processing.
Tags: Large Language Model, Transformers, Other
Author: UMCU

## KoSaul V0.2
License: MIT
KoSaul-8B is a Korean large language model built by continual pretraining on the Open-ko-llama3-8B base model, specializing in professional applications in the legal and medical fields.
Tags: Large Language Model, Transformers, Korean
Author: ingeol

## SRoBERTa F
License: Apache-2.0
A RoBERTa model trained on a 43 GB corpus of Croatian and Serbian text, supporting masked language modeling tasks.
Tags: Large Language Model, Transformers, Other
Author: Andrija

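
The SRoBERTa entry above supports masked language modeling: the model scores candidate tokens for a masked position, and the highest-scoring candidate fills the gap. A toy sketch of that selection step (the candidate words and scores below are invented for illustration, not real model output):

```python
# Toy fill-mask step: given model scores for each candidate token at a
# [MASK] position, pick the highest-scoring one. Scores are invented.
candidate_scores = {
    "grad": 0.62,   # "city" -- plausible completion in this toy example
    "pas": 0.05,    # "dog"
    "voda": 0.08,   # "water"
}

def fill_mask(scores):
    """Return the candidate token with the highest model score."""
    return max(scores, key=scores.get)

print(fill_mask(candidate_scores))  # prints "grad"
```

In a real masked-LM, the scores come from a softmax over the full vocabulary at the masked position; the selection logic is the same.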
© 2025 AIbase